New AR Tool Turns Household Items into Typing Surfaces
Virtual keyboards have long been a challenge for users of augmented reality systems. They are often slow, visually distracting, and physically uncomfortable, especially when people must keep their hands raised to type in midair. A research group at the University of Texas at Dallas has been exploring ways to make text input more efficient and comfortable. The team believes that better typing methods could help make augmented reality more useful in both casual and professional settings.
The researchers from the Erik Jonsson School of Engineering and Computer Science have developed a new interface known as PropType. This system allows individuals to transform ordinary objects, such as coffee mugs, books, or water bottles, into typing platforms. The technology projects an augmented keyboard onto a handheld object and adapts to various forms, including curved or irregular shapes. By relying on objects that people already keep nearby, the system aims to link the physical and digital worlds while improving interaction.
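To give a rough sense of what adapting a keyboard to a curved object involves, the sketch below wraps a flat row of keys around a cylinder roughly the size of a mug. It is a minimal illustration of the idea described above, not code from PropType itself; the function name, radius, and key spacing are assumptions made for the example.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: wrapping a flat key row around a cylindrical prop,
# such as a coffee mug. Names and parameters are illustrative only and are
# not taken from the PropType implementation.

@dataclass
class Key3D:
    label: str
    x: float  # around the cylinder, metres
    y: float  # along the cylinder axis, metres
    z: float  # toward the user, metres

def wrap_row_on_cylinder(labels, radius=0.04, row_height=0.0, arc_span=math.pi / 2):
    """Place one row of keys along an arc of a cylinder with the given radius.

    The row is centred on the side of the prop facing the user; arc_span is
    the total angle (in radians) the row occupies around the cylinder axis.
    """
    keys = []
    n = len(labels)
    for i, label in enumerate(labels):
        # Spread the keys evenly across the arc, centred at angle zero.
        theta = (i / max(n - 1, 1) - 0.5) * arc_span
        keys.append(Key3D(
            label=label,
            x=radius * math.sin(theta),
            y=row_height,
            z=radius * math.cos(theta),
        ))
    return keys

# Example: the top QWERTY row wrapped around a mug-sized cylinder.
top_row = wrap_row_on_cylinder(list("QWERTYUIOP"), radius=0.04, row_height=0.02)
for key in top_row[:3]:
    print(key)
```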
Because the props are objects already within a user's reach, they give the virtual keyboard a tangible surface to sit on. That tactile quality provides a level of key confirmation that traditional virtual keyboards lack, reducing dependence on constant visual monitoring. The researchers say this sensory feedback helps users type more confidently, since they can feel the object they are interacting with rather than pressing invisible keys.
Student researchers in the Multimodal Interaction Lab produced a demonstration video showing how PropType works across a range of objects. The team wants the system to deliver a more natural and immersive experience than traditional augmented reality typing tools. Users often find virtual keyboards distracting because they require constant visual focus and introduce physical fatigue. External physical keyboards can disrupt immersion and are less practical for mobile scenarios. PropType aims to solve both problems by making typing feel integrated and accessible.
To create the system, the researchers examined how people naturally hold and use common objects, observing sixteen participants to document their grip positions and typing gestures. Based on these observations, the team designed custom keyboard layouts tailored to different object categories. The prototype also includes an editing tool that lets users personalise layouts and visual elements, making text entry more adaptable to personal preference.
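One plausible way to organise per-object layouts and user edits is a small registry of defaults keyed by object category, with personal overrides applied on top. The sketch below illustrates that idea only; the category names, layout fields, and editing operations are assumptions for the example, not the PropType editor's actual design.

```python
# Hypothetical sketch of per-object keyboard layouts with user customisation.
# Categories, layout fields, and values are illustrative assumptions.

DEFAULT_LAYOUTS = {
    "mug":    {"rows": ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"], "key_size_mm": 12},
    "book":   {"rows": ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"], "key_size_mm": 16},
    "bottle": {"rows": ["QWERTY", "UIOPAS", "DFGHJK", "LZXCVBNM"], "key_size_mm": 10},
}

def personalised_layout(category, overrides=None):
    """Start from the default layout for an object category and apply the
    user's edits, for example a larger key size or a rearranged row."""
    layout = dict(DEFAULT_LAYOUTS[category])  # shallow copy of the defaults
    layout.update(overrides or {})
    return layout

# Example: a user holding a water bottle prefers slightly larger keys.
print(personalised_layout("bottle", {"key_size_mm": 14}))
```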
PropType earned recognition in April with a Best Paper Honourable Mention at the ACM CHI Conference on Human Factors in Computing Systems held in Yokohama, Japan. The system was later showcased at the ACM Symposium on User Interface Software and Technology in Busan, South Korea, where it continued to receive academic interest.
The project is part of a larger research programme involving haptics, which explores how sensations such as touch, temperature, and physical resistance can make virtual interactions more realistic. The team has gained national and international attention for studies that include thermal-tactile integration and thermal masking. These approaches can create the illusion of warmth or cold in a different area of the body than where the physical stimulus originates. The work suggests that sensory feedback could significantly improve the realism of virtual or augmented environments.
One of the team’s recent presentations at the 2024 ACM CHI Conference focused on combining vibration and temperature to influence perception. This research could eventually support more immersive interfaces for gaming, healthcare training, or virtual simulations where physical realism is essential.
PropType was developed with contributions from co-authors including Hyunjae Gil, a former UT Dallas postdoctoral researcher and now assistant professor at the Daegu Gyeongbuk Institute of Science and Technology, as well as Iniyan Joseph and Ashish Pratap, a computer science doctoral student. The project received funding from the Institute of Information and Communications Technology Planning and Evaluation of South Korea.








